32 research outputs found

    Multilevel Double Loop Monte Carlo and Stochastic Collocation Methods with Importance Sampling for Bayesian Optimal Experimental Design

    An optimal experimental set-up maximizes the value of data for statistical inferences and predictions. The efficiency of strategies for finding optimal experimental set-ups is particularly important for experiments that are time-consuming or expensive to perform. For instance, when the experiments are modeled by partial differential equations (PDEs), multilevel methods have been proven to reduce dramatically the computational complexity of their single-level counterparts when estimating expected values. For such a setting, we propose two multilevel methods for estimating a popular design criterion known as the expected information gain in simulation-based Bayesian optimal experimental design. The expected information gain criterion has a nested expectation form, and only a handful of multilevel methods have been proposed for problems of this kind. We propose a Multilevel Double Loop Monte Carlo (MLDLMC) method, a multilevel strategy built on Double Loop Monte Carlo (DLMC), and a Multilevel Double Loop Stochastic Collocation (MLDLSC) method, which performs the high-dimensional integration by deterministic quadrature on sparse grids. For both methods, the Laplace approximation is used for importance sampling, which significantly reduces the computational work of estimating the inner expectations. The optimal values of the method parameters are determined by minimizing the average computational work, subject to satisfying the desired error tolerance. The computational efficiency of the methods is demonstrated by estimating the expected information gain for Bayesian inference of the fiber orientation in composite laminate materials from an electrical impedance tomography experiment. MLDLSC performs better than MLDLMC when the regularity of the quantity of interest with respect to the additive noise and the unknown parameters can be exploited.
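
    The nested structure of the estimator can be made concrete with a short sketch in the spirit of MLDLMC. This is a minimal illustration, not the authors' implementation: the toy forward model `forward`, the scalar Gaussian prior and noise, and all sample sizes are assumptions, and the Laplace-based importance sampling of the inner loop is omitted here (see the Laplace-based importance sampling entry below).

```python
# Minimal sketch: double-loop Monte Carlo (DLMC) for the expected
# information gain (EIG), wrapped in a multilevel telescoping sum.
# forward(theta, level) stands in for a PDE solve at mesh level `level`.
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1  # additive observation-noise std (assumed)

def forward(theta, level):
    # the 2^-level term mimics discretization error vanishing with level
    return np.sin(theta) + 2.0 ** (-level) * np.cos(theta)

def dlmc_term(level, thetas, eps, inner):
    # EIG ~ (1/N) sum_n [log p(y_n|th_n) - log (1/M) sum_m p(y_n|th_nm)]
    loglik = lambda y, th: -0.5 * ((y - forward(th, level)) / sigma) ** 2
    ys = forward(thetas, level) + sigma * eps
    log_evid = np.log(np.mean(np.exp(loglik(ys[:, None], inner)), axis=1))
    return np.mean(loglik(ys, thetas) - log_evid)

def mldlmc_eig(L=3, N0=4096, M=64):
    # telescoping sum EIG_L = EIG_0 + sum_l (EIG_l - EIG_{l-1}); each
    # difference reuses the SAME draws on both levels (the MLMC coupling)
    est = 0.0
    for l in range(L + 1):
        N = max(N0 // 2 ** l, 8)
        thetas, eps = rng.normal(size=N), rng.normal(size=N)
        inner = rng.normal(size=(N, M))  # fresh prior draws, inner loop
        d = dlmc_term(l, thetas, eps, inner)
        if l > 0:
            d -= dlmc_term(l - 1, thetas, eps, inner)
        est += d
    return est

print(mldlmc_eig())
```

    The coupling is the essential point: each level difference evaluates both levels on the same random draws, so its variance shrinks as the levels approach each other and fewer samples are needed on the expensive fine levels.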

    Mean-Field Games for Marriage

    This article examines mean-field games for marriage. The results support the argument that optimizing long-term well-being through effort, given the feeling-state distribution of society (the mean field), helps to stabilize marriage. However, if the cost of effort is very high, the couple fluctuates in a bad feeling state or the marriage breaks down. We then examine the influence of society on a couple using mean-field sentimental games. We show that, in the mean-field equilibrium, the optimal effort is always higher than the one-shot optimal effort. We illustrate numerically the influence of the couple's network on their feeling states and their well-being. Comment: 22 figures. Accepted and to appear in PLoS ONE.
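
    The paper's model is not reproduced here, but the long-run versus one-shot effort comparison can be illustrated with a deliberately simple linear-quadratic toy in which the feeling state evolves as x_{t+1} = r x_t + e_t, per-period well-being is x_t - c e_t^2 with discount factor gamma, and the mean-field coupling is omitted. Everything in this sketch, including all parameter values, is an assumption for illustration.

```python
# Toy LQ illustration (not the paper's model) of why long-run optimal
# effort exceeds one-shot optimal effort in a feeling-state dynamic.
gamma, r, c = 0.9, 0.8, 1.0   # discount, feeling persistence, effort cost

# One-shot: maximize (r*x + e) - c*e**2 over e  =>  e_one = 1 / (2c)
e_one = 1.0 / (2.0 * c)

# Long run: guess V(x) = p*x + q; matching coefficients in the Bellman
# equation gives p = 1 / (1 - gamma*r), and the first-order condition
# -2*c*e + gamma*p = 0 gives e_long = gamma*p / (2c).
p = 1.0 / (1.0 - gamma * r)
e_long = gamma * p / (2.0 * c)

print(e_one, e_long)  # 0.5 vs ~1.61: sustained effort pays off over time
```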

    Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    In calculating the expected information gain in optimal Bayesian experimental design, the inner loop of the classical double-loop Monte Carlo estimator requires a large number of samples and suffers from numerical underflow when the number of samples is small. These drawbacks can be avoided with an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling, based on the Laplace method, into the inner loop. We derive the optimal values of the method parameters by minimizing the average computational cost subject to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo and with a more recent single-loop Monte Carlo method that uses the Laplace method to approximate the value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites. Comment: 42 pages, 35 figures.
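
    A minimal sketch of the inner-loop change described above: the evidence p(y) is estimated by sampling from a Gaussian (Laplace) approximation centred at the posterior mode and reweighting, with a log-sum-exp step to avoid the underflow mentioned in the abstract. The scalar forward model `g`, the Gaussian prior and noise, and the finite-difference Hessian are illustrative assumptions, not the paper's setup.

```python
# Sketch: inner-loop evidence estimate with Laplace-based importance
# sampling. Constants such as 1/sqrt(2*pi) are dropped throughout; they
# cancel when the evidence enters the expected information gain.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
sigma = 0.1        # observation-noise std (assumed)
g = np.sin         # toy scalar forward model (assumed)

def log_prior(th):
    return -0.5 * th ** 2                       # standard normal prior

def log_lik(y, th):
    return -0.5 * ((y - g(th)) / sigma) ** 2    # Gaussian likelihood

def log_evidence_lis(y, M=32):
    # Laplace step: mode and curvature of the negative log-posterior
    neg_log_post = lambda th: -(log_lik(y, th) + log_prior(th))
    th_map = minimize_scalar(neg_log_post).x
    h = 1e-4
    hess = (neg_log_post(th_map + h) - 2 * neg_log_post(th_map)
            + neg_log_post(th_map - h)) / h ** 2
    s = 1.0 / np.sqrt(max(hess, 1e-8))          # Laplace posterior std
    # importance sampling: p(y) = E_q[p(y|th)p(th)/q(th)], q = N(th_map, s^2)
    th = th_map + s * rng.normal(size=M)
    log_q = -0.5 * ((th - th_map) / s) ** 2 - np.log(s)
    log_w = log_lik(y, th) + log_prior(th) - log_q
    m = log_w.max()                             # log-sum-exp: no underflow
    return m + np.log(np.mean(np.exp(log_w - m)))

print(log_evidence_lis(y=0.4))
```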

    Nesterov-aided Stochastic Gradient Methods using Laplace Approximation for Bayesian Design Optimization

    Finding the best setup for experiments is the primary concern of Optimal Experimental Design (OED). Here, we focus on the Bayesian experimental design problem of finding the setup that maximizes the Shannon expected information gain. We use stochastic gradient descent and its accelerated counterpart, which employs Nesterov's method, to solve the optimization problem in OED. We adapt a restart technique, originally proposed for acceleration in deterministic optimization, to improve stochastic optimization methods. We combine these optimization methods with three estimators of the objective function: the double-loop Monte Carlo estimator (DLMC), the Monte Carlo estimator using the Laplace approximation for the posterior distribution (MCLA), and the double-loop Monte Carlo estimator with Laplace-based importance sampling (DLMCIS). Using stochastic gradient methods and Laplace-based estimators together allows us to use expensive and complex models, such as those that require solving partial differential equations (PDEs). From a theoretical viewpoint, we derive an explicit formula for the gradient estimator of the Monte Carlo methods, including MCLA and DLMCIS. From a computational standpoint, we study four examples: three based on analytical functions and one using the finite element method. The last example is an electrical impedance tomography experiment based on the complete electrode model. In these examples, the accelerated stochastic gradient descent method using MCLA converges to local maxima with up to five orders of magnitude fewer model evaluations than gradient descent with DLMC. Comment: 36 pages, 14 figures.
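
    A minimal sketch of the accelerated ascent loop with a restart heuristic follows. The gradient estimator `grad_est` is a hypothetical stand-in for an MCLA or DLMCIS gradient of the expected information gain over the design xi, and the specific restart rule (drop momentum when it reverses direction) is one common variant; both are assumptions rather than the paper's exact choices.

```python
# Sketch: Nesterov-accelerated stochastic gradient ascent with a restart
# heuristic, maximizing a design criterion U(xi) from noisy gradients.
import numpy as np

rng = np.random.default_rng(2)

def grad_est(xi):
    # hypothetical stand-in for an MCLA/DLMCIS gradient estimate; here
    # the toy objective is U(xi) = -||xi - 1||^2, plus sampling noise
    return -2.0 * (xi - 1.0) + 0.1 * rng.normal(size=xi.shape)

def accelerated_ascent(xi0, lr=0.05, iters=200):
    xi, v, t = xi0.copy(), np.zeros_like(xi0), 1.0
    for _ in range(iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = xi + ((t - 1.0) / t_next) * v     # Nesterov look-ahead point
        xi_new = y + lr * grad_est(y)         # gradient ascent step at y
        v_new = xi_new - xi
        if np.dot(v_new, v) < 0.0:            # restart: momentum reversed,
            t_next = 1.0                      # so reset the acceleration
        xi, v, t = xi_new, v_new, t_next
    return xi

print(accelerated_ascent(np.zeros(3)))        # converges near [1, 1, 1]
```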

    Development and validation of a food photography manual, as a tool for estimation of food portion size in epidemiological dietary surveys in Tunisia

    Background: Estimation of food portion sizes has always been a challenge in dietary studies on free-living individuals. The aim of this work was to develop and validate a food photography manual to improve the accuracy of the estimated size of consumed food portions.
    Methods: A manual was compiled from digital photos of foods commonly consumed by the Tunisian population. The food was cooked and weighed before digital photographs were taken of three portion sizes. The manual was validated by comparing the 24-hour recall method (using photos) with the reference method, food weighing (FW). For both methods, the comparison covered food intake amounts as well as nutritional measures. Validity was assessed by Bland–Altman limits of agreement. In total, 31 male and female volunteers aged 9–89 years participated in the study.
    Results: We focused on eight food categories and compared their estimated amounts (using the 24-hour recall method) with those actually consumed (using FW). Animal products and sweets were underestimated, whereas pasta, bread, vegetables, fruits, and dairy products were overestimated. However, the difference between the two methods was not statistically significant except for pasta (p < 0.05) and dairy products (p < 0.05). The coefficient of correlation between the two methods was highly significant, ranging from 0.876 for pasta to 0.989 for dairy products. Nutrient intakes calculated for the two methods showed non-significant differences except for fat (p < 0.001) and dietary fiber (p < 0.05). A highly significant correlation was observed between the two methods for all micronutrients. The agreement test highlights the lack of difference between the two methods.
    Conclusion: The difference between the 24-hour recall method using digital photos and the weighing method is acceptable. Our findings indicate that the food photography manual can be a useful tool for quantifying food portion sizes in epidemiological dietary surveys.
    Keywords: food portion sizes; Tunisia; weighed foods; 24-hour recall; portion size photographs; portion size estimation.
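
    For reference, the Bland–Altman limits-of-agreement computation used for validation reduces to a few lines; the paired values below are hypothetical portion-size estimates in grams, not the study's data.

```python
# Sketch: Bland-Altman limits of agreement for paired portion-size
# estimates (photo-aided 24-hour recall vs. reference food weighing).
import numpy as np

recall  = np.array([120.0, 85.0, 200.0, 150.0, 95.0])   # recall estimates (g)
weighed = np.array([110.0, 90.0, 210.0, 140.0, 100.0])  # weighed reference (g)

diff = recall - weighed
bias = diff.mean()                           # mean difference (systematic bias)
sd = diff.std(ddof=1)                        # std of the paired differences
lo, hi = bias - 1.96 * sd, bias + 1.96 * sd  # 95% limits of agreement
print(f"bias = {bias:.1f} g, 95% LoA = [{lo:.1f}, {hi:.1f}] g")
```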

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
    RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
    RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
    CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    A generalized finite difference method for the 2-D nonlinear shallow water equations

    In this paper, we propose a generalized finite difference method for the two-dimensional nonlinear shallow water equations. The space discretization uses the Arakawa staggered C-grid. Besides the implicit-explicit factor theta, the time discretization involves a balance ratio alpha of the spatial nodes. The stability analysis takes into account the size of the parameters. We discuss the stabilizing properties of the scheme and present some numerical experiments.
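
    A one-dimensional analogue of the two ingredients named above, the staggered (Arakawa C-type) placement of variables and the theta-weighted implicit-explicit time discretization, is sketched below for the linearized equations with periodic boundaries. The paper treats the two-dimensional nonlinear case; the model reduction and all parameter values here are illustrative assumptions.

```python
# 1-D linearized shallow water on a staggered C-grid with a theta
# (implicit-explicit) time discretization; eta sits at cell centres,
# u at cell faces. For this linear model, theta >= 1/2 makes the
# gravity-wave part unconditionally stable.
import numpy as np

n, dx, dt = 64, 1.0, 0.5
g, H, theta = 9.81, 1.0, 0.6
lam = dt / dx

x = dx * np.arange(n)
eta = np.exp(-0.05 * (x - n * dx / 2) ** 2)   # initial free-surface bump
u = np.zeros(n)                               # u[i] lives at face i+1/2

def dxf(a):  # forward difference a[i+1] - a[i], periodic
    return np.roll(a, -1) - a

def dxb(a):  # backward difference a[i] - a[i-1], periodic
    return a - np.roll(a, 1)

# implicit operator: (I - g*H*(lam*theta)^2 * Laplacian) eta_new = rhs
Lap = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
Lap[0, -1] = Lap[-1, 0] = 1.0                 # periodic wrap-around
A = np.eye(n) - g * H * (lam * theta) ** 2 * Lap

for _ in range(200):
    u_star = u - g * lam * (1 - theta) * dxf(eta)   # explicit part of u
    rhs = eta - H * lam * ((1 - theta) * dxb(u) + theta * dxb(u_star))
    eta = np.linalg.solve(A, rhs)                   # implicit eta solve
    u = u_star - g * lam * theta * dxf(eta)         # close the u update
```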

    Boundary Feedback Stabilization of Two-Dimensional Shallow Water Equations with Viscosity Term

    This paper treats a water-flow regularization problem by means of local boundary conditions for the two-dimensional viscous shallow water equations. Using an a priori energy estimate of the perturbation state and the Faedo–Galerkin method, we build a stabilizing boundary feedback control law for the volumetric flow over a finite time horizon prescribed by the solvability of the associated Cauchy problem. We iterate the same approach to build, by cascade, a stabilizing feedback control law for infinite time. Thanks to an arbitrary positive time-dependent stabilization function, the control law provides an exponential decay of the energy.
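
    In results of this type, the positive time-dependent stabilization function, call it sigma, typically enters an energy estimate of the following generic form; this is an assumed sketch of the standard statement, not the paper's exact theorem, with E(t) denoting the energy of the perturbation state.

```latex
E(t) \;\le\; E(0)\,\exp\!\left(-\int_0^{t} \sigma(s)\,\mathrm{d}s\right),
\qquad \sigma(s) > 0 \ \text{for all } s \ge 0 .
```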
